Independent component analysis: algorithms and applications

PII: S0893-6080(00)00026-5

Authors

  • A. Hyvärinen
  • E. Oja
Abstract

A fundamental problem in neural network research, as well as in many other disciplines, is finding a suitable representation of multivariate data, i.e. random vectors. For reasons of computational and conceptual simplicity, the representation is often sought as a linear transformation of the original data. In other words, each component of the representation is a linear combination of the original variables. Well-known linear transformation methods include principal component analysis, factor analysis, and projection pursuit. Independent component analysis (ICA) is a recently developed method in which the goal is to find a linear representation of non-Gaussian data so that the components are statistically independent, or as independent as possible. Such a representation seems to capture the essential structure of the data in many applications, including feature extraction and signal separation. In this paper, we present the basic theory and applications of ICA, and our recent work on the subject. © 2000 Published by Elsevier Science Ltd.
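As a concrete illustration of the linear, statistically independent representation described above, here is a minimal blind source separation sketch using scikit-learn's FastICA. The synthetic sources, the mixing matrix, and the choice of this particular library are assumptions made for illustration; they are not details taken from the paper itself.

```python
# Minimal ICA sketch (illustrative assumptions only): two independent
# non-Gaussian sources, an unknown linear mixture x = A s, and FastICA
# to recover components that are as independent as possible.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples = 2000
t = np.linspace(0, 8, n_samples)

# Two statistically independent, non-Gaussian sources (assumed example).
s1 = np.sign(np.sin(3 * t))          # square wave (sub-Gaussian)
s2 = rng.laplace(size=n_samples)     # Laplacian noise (super-Gaussian)
S = np.c_[s1, s2]

# Observed data: a linear mixture of the sources with an unknown matrix A.
A = np.array([[1.0, 0.5],
              [0.7, 1.3]])
X = S @ A.T

# Estimate an unmixing transformation; the recovered components match the
# sources only up to permutation and scaling, as is inherent to ICA.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)
print("estimated mixing matrix:\n", ica.mixing_)
```

FastICA is used here only for brevity; ICA can equally be estimated with other algorithms built on the same independence principle.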


Similar articles


Pii: S0893-6080(00)00062-9

This article gives an overview of the different functional brain imaging methods, the kinds of questions these methods try to address and some of the questions associated with functional neuroimaging data for which neural modeling must be employed to provide reasonable answers. © 2000 Published by Elsevier Science Ltd.


Pii: S0893-6080(00)00043-5

It is demonstrated that rotational invariance and reflection symmetry of image classifiers lead to a reduction in the number of free parameters in the classifier. When used in adaptive detectors, e.g. neural networks, this may be used to decrease the number of training samples necessary to learn a given classification task, or to improve generalization of the neural network. Notably, the symmet...
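As a toy illustration of the parameter reduction described in the snippet above (an assumed example, not code from the cited paper): constraining a 3x3 image filter to be invariant under the rotations and reflections of the square ties its nine weights into three independent values, so an adaptive detector built from such filters has fewer free parameters to learn.

```python
# Toy sketch (assumed example): weight tying under rotational invariance and
# reflection symmetry reduces a 3x3 filter from 9 free parameters to 3.
import numpy as np

def symmetric_filter(center, edge, corner):
    """3x3 filter invariant under 90-degree rotations and reflections."""
    return np.array([[corner, edge,   corner],
                     [edge,   center, edge  ],
                     [corner, edge,   corner]])

w = symmetric_filter(center=1.0, edge=-0.2, corner=0.1)

# Invariance check: rotating or mirroring the filter leaves it unchanged.
assert np.allclose(w, np.rot90(w))
assert np.allclose(w, np.fliplr(w))
print("free parameters: 3 (vs. 9 for an unconstrained 3x3 filter)")
```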


Assessing interactions among neuronal systems using functional neuroimaging

We show that new methods for measuring effective connectivity allow us to characterise the interactions between brain regions that underlie the complex interactions among different processing stages of functional architectures.


Best approximation by Heaviside perceptron networks

In Lp-spaces with p ∈ [1, ∞) there exists a best approximation mapping to the set of functions computable by Heaviside perceptron networks with n hidden units; however, for p ∈ (1, ∞) such a best approximation is not unique and cannot be continuous.
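Stated slightly more formally (the notation below is an illustrative gloss of the snippet, not taken from the cited paper), with the Heaviside step function written as theta:

```latex
% Illustrative notation (assumed): \vartheta is the Heaviside step function and
% \mathcal{H}_n collects the functions computable by n Heaviside hidden units.
\[
\mathcal{H}_n = \Bigl\{\, x \mapsto \sum_{i=1}^{n} w_i\, \vartheta(v_i \cdot x + b_i)
  \;:\; w_i, b_i \in \mathbb{R},\ v_i \in \mathbb{R}^d \Bigr\}
\]
% Existence of a best approximation mapping for p in [1, infinity):
\[
\exists\, \Phi : L^p \to \mathcal{H}_n \quad \text{with} \quad
\lVert f - \Phi(f) \rVert_p = \inf_{h \in \mathcal{H}_n} \lVert f - h \rVert_p
\quad \text{for all } f \in L^p,
\]
% but for p in (1, infinity) the nearest point need not be unique and no such
% mapping \Phi can be continuous.
```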



Journal:

Volume:   Issue:

Pages:  -

Publication year: 2000